The Entropy Power Inequality and Mrs. Gerber's Lemma for Abelian Groups of Order 2^n

Authors

  • Varun Jog
  • Venkat Anantharam
Abstract

Shannon’s Entropy Power Inequality can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The entropy power inequality has played a key role in resolving a number of problems in information theory. It is therefore interesting to examine the existence of a similar inequality for discrete random variables. In this paper we obtain an entropy power inequality for random variables taking values in an abelian group of order 2^n, i.e. for such a group G we explicitly characterize the function f_G(x, y) giving the minimum entropy of the sum of two independent G-valued random variables with respective entropies x and y. Random variables achieving the extremum in this inequality are thus the analogs of Gaussians in this case, and these are also determined. It turns out that f_G(x, y) is convex in x for fixed y and, by symmetry, convex in y for fixed x. This is a generalization to abelian groups of order 2^n of the result known as Mrs. Gerber’s Lemma.
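In the simplest case n = 1, G = Z_2, the extremal random variables are Bernoulli and f_G reduces to the classical Mrs. Gerber form f(x, y) = h(h^{-1}(x) ∗ h^{-1}(y)), where h is the binary entropy function and ∗ denotes binary convolution. A minimal numeric sketch of this special case (the function names below are mine, not the paper's):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h_inv(x, tol=1e-12):
    """Inverse of h restricted to [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def conv(a, b):
    """Binary convolution a * b: P(Bern(a) XOR Bern(b) = 1)."""
    return a * (1 - b) + b * (1 - a)

def f_z2(x, y):
    """Minimum entropy of the sum of two independent Z_2-valued
    random variables with entropies x and y (Bernoulli extremizers)."""
    return h(conv(h_inv(x), h_inv(y)))
```

For example, f_z2(x, 0.0) returns x (adding a constant changes nothing), and f_z2 is never smaller than either argument, consistent with the sum of independent variables not losing entropy over a group.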


Related articles

The Entropy Power Inequality and Mrs. Gerber's Lemma for groups of order 2^n

Shannon’s Entropy Power Inequality (EPI) can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The EPI is a powerful tool and has been used to resolve a number of problems in information theory. In this paper we examine the existence of a similar entropy inequality for discrete random variabl...


Minimum MS. E. Gerber's Lemma

Mrs. Gerber's Lemma lower bounds the entropy at the output of a binary symmetric channel in terms of the entropy of the input process. In this paper, we lower bound the output entropy via a different measure of input uncertainty, pertaining to the minimum mean squared error (MMSE) prediction cost of the input process. We show that in many cases our bound is tighter than the one obtained from M...


Generalization of Mrs. Gerber's Lemma

Mrs. Gerber’s Lemma (MGL) hinges on the convexity of H(p ∗ H^{-1}(u)), where H(u) is the binary entropy function. In this work, we prove that H(p ∗ f(u)) is convex in u for every p ∈ [0, 1] provided H(f(u)) is convex in u, where f(u) : (a, b) → [0, 1/2]. Moreover, our result subsumes MGL and simplifies the original proof. We show that the generalized MGL can be applied in the binary broadcast channel to s...
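The classical convexity claim is easy to probe numerically. The sketch below spot-checks midpoint convexity of u ↦ H(p ∗ H^{-1}(u)), i.e. the f(u) = H^{-1}(u) instance that recovers MGL; the helper names are mine:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h_inv(x, tol=1e-12):
    """Inverse of h restricted to [0, 1/2], by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def conv(p, q):
    """Binary convolution p * q = p(1-q) + q(1-p)."""
    return p * (1 - q) + q * (1 - p)

def g(u, p):
    """The map u -> H(p * H^{-1}(u)) whose convexity in u MGL asserts."""
    return h(conv(p, h_inv(u)))

# Midpoint-convexity spot check over a grid of (u1, u2) pairs.
p = 0.11
us = [i / 20 for i in range(1, 20)]
midpoint_convex = all(
    g((u1 + u2) / 2, p) <= (g(u1, p) + g(u2, p)) / 2 + 1e-9
    for u1 in us for u2 in us
)
```

Since g is continuous, midpoint convexity on a dense grid is good numerical evidence for convexity, though of course not a proof.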


On the Exponent of Triple Tensor Product of p-Groups

The non-abelian tensor product of groups, which has its origins in algebraic K-theory as well as in homotopy theory, was introduced by Brown and Loday in 1987. Group-theoretical aspects of non-abelian tensor products have been studied extensively. In particular, some studies focused on the relationship between the exponent of a group and the exponent of its tensor square. On the other hand, com...


Inequalities for Sums of Joint Entropy Based on the Strong Subadditivity

In what follows, V = {1, ..., n} is the set of indices of the given random variables X_1, ..., X_n, and B = {B_1, ..., B_m} is a set of subsets (possibly with repeats) of V. Furthermore, for S = {i_1, ..., i_l} ⊆ V, X_S and H(X_S) denote the random vector (X_{i_1}, ..., X_{i_l}) and its Shannon entropy H(X_{i_1}, ..., X_{i_l}) (with H(X_∅) = 0). The power set (the set of all subsets) and the set of all l-su...




Journal:
  • CoRR

Volume: abs/1207.6355

Publication date: 2012